Information Sheet
Developing from JDTAs using Content Types, Action Verbs and the Platform Option

Introduction:
You can perform different parts of the E2E process at the same time, or apply information gathered earlier in the process to answer questions in later stages of the ISD / SAT approach. Each phase of the process is important, but understanding how these phases work together is the key to making the tool work for you. This information sheet shows what those connection points look like and how they work together.



Information:
Example 1: JDTA to CPM Projects (Learning Objectives using different Content Types, Action Verbs and/or Platform options).
  • A JDTA looks like this (see Figures 1 and 2 for an example):
    • Duty - Operate Sonar
      • Task - Operate Sonar Equipment - KSATR - Knowledge = Different types of Arrays, Projectors, Hydrophones. Describe SONAR.
        • Sub-task - Operate Transmission Sub-system / Equipment
          • Step - Change to Continuous Wave
          • Step - Change Frequency Modulation
          • Step - Change Frequency Pulse Rate
          • Step - Operate in Single Mode
          • Step - Operate in Multi-beam combinations
          • Step - Operate in Omnidirectional Transmission (ODT)
        • Sub-task - Operate Reception Equipment
        • Sub-task - Operate in Passive Mode
      • Task - Interpret SONAR Presentations - KSATR - Knowledge = Perform Equations
        • Sub-Task - Determine range
        • Sub-Task - Determine bearing
        • Sub-Task - Interpret Principal Display Elements
  • Converting JDTA data to Learning Objectives in CPM Projects (Learning Objectives are made up of 5 fields minimum; a small sketch follows the note below)
      1. Condition - "Given appropriate technical documentation and procedures"
      2. Platform (Platform, System, Sub-System, Component and Non-Equipment) - "with an Active/Passive Sonar System" or "BSY-1A"
      3. Action Verb - "Operate"
      4. Free Text (will auto-populate with JDTA Task Text if linked) - Sonar Equipment
      5. Standard - "by analyzing waveforms, determining range, bearing and estimated speed with 80% accuracy."
  • Converting JDTA to a Learning Objective and Selecting a Content Type:
    • Operate Sonar Equipment (CPM Project) (EXAMPLE)
    • Operate SONAR Equipment (supported by different Content Types); some examples are:
      • Principle only (using the Fact and other Elements to support Knowledge requirements, such as "analyze waveform")
      • A Principle for the task, with multiple Procedures for the sub-task items that address operating the equipment (like Transmission), using the Fact and other Elements to support Knowledge requirements
      • A Procedure to support Operate SONAR Equipment, and a Concept (Content Type) to support Describe the Operation of SONAR Equipment.
    • Operate (Active Sonar, BSY-1A, DDG-51) SONAR Equipment (examples of using Platform, System, Sub-System, Component, or Non-Equipment).
    • Operate (the Action Verb) can remain Operate, but can also be changed to create "Analyze the Waveforms", "Determine Range, Bearing and Speed", or "Demonstrate the Operation".
NOTE: Many different combinations of Content Types are possible; the selection is driven by analysis of the work being performed, the amount of information that needs to be covered, and the assessment strategy needed to measure achievement of those requirements.
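
The sketch below (Python, purely illustrative; it is not CPM functionality, and the function name and field order are assumptions for illustration) shows how the five minimum Learning Objective fields from the example above combine into a single objective statement:

    # Illustrative sketch only -- not part of CPM.
    def build_learning_objective(condition, platform, action_verb, free_text, standard):
        """Join the five minimum Learning Objective fields, in order, into one statement."""
        return " ".join([condition, platform, action_verb, free_text, standard])

    lo = build_learning_objective(
        condition="Given appropriate technical documentation and procedures,",
        platform="with an Active/Passive Sonar System,",  # Platform / System / Sub-System / Component / Non-Equipment
        action_verb="Operate",                            # Action Verb
        free_text="Sonar Equipment",                      # auto-populates from the JDTA Task text if linked
        standard="by analyzing waveforms, determining range, bearing and estimated speed with 80% accuracy.",
    )
    print(lo)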

[Images: whats_wrong_w_JDTA1.jpg | John_sonar_example_short.jpg]
Figure 1 - JDTA Before (Bad Example - besides being knowledge-based, note the number of JDTA Tasks) <----------> Figure 2 - JDTA After it was fixed (Better Example - Focused Task Statements)



[Image: John_sonar_example_a4.jpg]
Figure 3 - Organization of JDTA data to support the final output of the FEA (as it would be seen in the JDTA)

NOTE: JDTAs are about the work, but they set up Learning Objectives (Performance Based) and the Assessment Strategy.
  • Example 1 - This approach works just like Example 2, but has two JDTA tasks instead of just one. While in this example the saving of a single task might not be significant, when your JDTA grows to 500 or 1,000 tasks, managing your JDTA and focusing on the critical tasks becomes more important.
  • Example 2 - This approach works if the designer understands the flexibility of using the CPM Data tab to tailor JDTAs during the development of Learning Objectives. This JDTA construct can be used for any SONAR-related JDTA; it is not dependent on platform, system, or environment. This approach creates the R3, since the Learning Objective constructs will now all come from a single JDTA task and all of the foundational knowledge and skills will be nearly identical with other course requirements. So, using this flexibility, you can take "Operate SONAR System", which (see the sketch after the note below):
    • can be tailored to read Operate "a DDG-51 class" (by using the platform option) SONAR System
    • can be tailored to read Operate "AN/AQS-20A Minehunting" or "Active" (by using the system or non-equipment/environment option) SONAR System
    • can be tailored to read Operate "BSY-1A" (by using the sub-system option) SONAR System
    • can even be tailored to read "Describe" (by changing the action verb) the operation of a DDG-51 class SONAR System
  • Example 3 - This sub-task work requirement is identical to Example 4; the only difference is the level defined during the JDTA. The disadvantage of this approach is that you only have steps under the sub-task, and you lose the flexibility of KSATRs if you need to expand or provide more work breakdown to support other Learning Objectives.
  • Example 4 - Gives you the benefit of flexibility by having sub-tasks, steps, and KSATRs. CPM works best when Learning Objectives are built off of JDTA Task statements.
  • Example 5 - Most importantly, gives you the option of defining KSATRs for indirectly supported requirements.
  • NOTE: All of the examples in Figure 3 are correct and may be used to meet specific requirements in defining the work. In most cases the second examples (2 and 4) provide more flexibility, but they also require more planning to accomplish. Please note that for examples 1 & 2 and 3 & 4, you would have one or the other in a JDTA, not both.
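
As a purely illustrative sketch (Python; not CPM behavior, and the function and variable names are assumptions), the tailoring described in Example 2 can be pictured as swapping a qualifier or action verb around a single JDTA task statement:

    # Illustrative sketch only -- mimics tailoring one JDTA task into several objectives.
    JDTA_TASK = "SONAR System"

    def tailor(action_verb="Operate", qualifier=None, task=JDTA_TASK):
        """Build a tailored objective phrase from one JDTA task statement."""
        subject = f"{qualifier} {task}" if qualifier else task
        return f"{action_verb} {subject}"

    print(tailor(qualifier="a DDG-51 class"))              # platform option
    print(tailor(qualifier="AN/AQS-20A Minehunting"))      # system option
    print(tailor(qualifier="an Active"))                   # non-equipment / environment option
    print(tailor(qualifier="BSY-1A"))                      # sub-system option
    print(tailor(action_verb="Describe the operation of",
                 qualifier="a DDG-51 class"))              # changed action verb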



[Image: John_sonar_example_a2.jpg]
Figure 4 - Breakdown of JDTA example in Figure 3 converted to a Course / Module / Lesson / Section Construct in CPM Projects (amount of training material or level being taught)

NOTE: Both examples are linked to the same JDTA data. The color codes on the icons to the left do have meaning.
  • Example 1 - Because of this design construct, the ELOs in Example 1 are designed for a KPL2 to SPL2 assessment approach. The designer would probably expect one of the following conditions:
    • Based on analysis that students have the prerequisite knowledge and skills to perform the task at the required proficiency levels
    • The job task can be fully completed in an hour or less for each section (that's notional - it might be up to 2 hours for each section)
    • The training might just be a short demonstration and/or high-level presentation.
  • Example 2 - This is very similar to the approach used in NAVEDTRA 131 (and, to a lesser extent, 130): build lessons to support KPL1, a different lesson for KPL2 - SPL1, and a skills-based lesson for practice and final performance assessment. Example 2 provides the most flexibility to cover large amounts of material and use the Learning Object (LO) Module effectively. A sketch of both constructs follows the note below.
  • NOTE - Neither approach is wrong, and both would support training requirements; the difference is how the Student can be evaluated and assessed to determine mastery of the required knowledge and skills to perform the work.
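
To make the contrast concrete, here is an illustrative sketch (Python; the lesson and section titles are hypothetical, not taken from an actual CPM project) of the two constructs: Example 1 keeps the ELOs as sections of a single lesson, while Example 2 splits knowledge, application, and skills practice into separate lessons in the NAVEDTRA 131 style:

    # Illustrative sketch only -- hypothetical lesson/section titles.
    example_1 = {
        "Lesson: Operate SONAR Equipment": [
            "Section (ELO): Operate Transmission Sub-system",   # each section holds content, practice, assessment
            "Section (ELO): Operate Reception Equipment",
            "Section (ELO): Operate in Passive Mode",
        ],
    }

    example_2 = {
        "Lesson: SONAR Fundamentals (KPL1)": [
            "Section (ELO): Describe arrays, projectors and hydrophones",
        ],
        "Lesson: SONAR Operation (KPL2 - SPL1)": [
            "Section (ELO): Operate Transmission Sub-system",
        ],
        "Lesson: SONAR Skills Practice and Final Performance Assessment": [
            "Section (ELO): Demonstrate operation of the SONAR system",
        ],
    }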



[Image: John_sonar_example_a3.jpg]
Figure 5 - Breakdown of JDTA example in Figure 3 converted to a Course / Module / Lesson / Section Construct in CPM Projects (Lesson / Section Construct)
NOTE: Both examples are linked to the same JDTA data.
  • Example 1 - The linkage of your JDTA to Learning Objectives and the CPM Project affects how your training will be developed and, more importantly, how it will be assessed. Remember that training is built around your Section level (Enabling Learning Objective): the Section level holds the training content, the practice, and the assessment, so your assessment is built around that ELO. If you need to assess other key performance indicators defined by the JDTA, then your Section-level ELOs must be aligned. It is important to stress that the work requirement drives the Learning Objective, which drives the assessment strategy, which drives which combinations of Content Types you will select inside a lesson.
  • Example 2 - This approach allows greater flexibility to assess the Student's mastery of the more critical performance requirements. In Example 2, the JDTA Task has been elevated to the Lesson level (Terminal Learning Objective) with supporting ELOs for all of the required performance measurements. Using this approach, the Instructor / Learning Site can better evaluate the Student and ensure they have a solid foundation.
  • NOTE - Neither approach is wrong, and both would support training requirements; the difference is how the Student can be evaluated and assessed to determine mastery of the required knowledge and skills to perform the work.



[Image: John_sonar_example_a1.jpg]
Figure 6 - Breakdown of JDTA example in Figure 3 converted to a Course / Module / Lesson / Section Construct in CPM Projects (Assessments)

NOTE: This example still uses the same JDTA Duty; the only difference between the two examples is whether the JDTA data is associated with a section (ELO) or a lesson (TLO).
NOTE: The different colors associated with the icons on the left (above) show how the project is being linked to the JDTA; see the chart of color codes.
  • Example 1 - This example simply places the assessment as part of a lesson; it could be renamed to support a knowledge test (KPL1 or KPL2). The bottom line is that it shows where the assessment will take place in the overall course construct, and that will be reflected in the Course Master Schedule (CMS).
  • Example 2 - In the old NAVEDTRA constructs (130 / 131), skill-based lessons, labs, or performance-driven events are normally found in a separate lesson. The Sections under this lesson are identical to the Sections seen in Example 1; they are just organized differently to show how they would be reflected in the Course Master Schedule (CMS). Since CPM currently does not create a "Testing Plan" like NAVEDTRA 130 / 131, showing the assessment strategy inside the CMS is critical.
  • NOTE: Both examples are correct. One main factor in deciding which example is best is how it should be reflected in the Course Master Schedule (CMS). If it is a lesson-level assessment or a practical lab exercise, it can be a separate section. If the plan is to go to the lab to practice and assess after covering all the knowledge, then that is more accurately reflected by Example 2.